
    Probing the basins of attraction of a recurrent neural network

    A recurrent neural network is considered that can retrieve a collection of patterns, as well as slightly perturbed versions of this `pure' set of patterns, via fixed points of its dynamics. By replacing the set of dynamical constraints, i.e., the fixed point equations, by an extended collection of fixed-point-like equations, analytical expressions are found for the weights w_ij(b) of the net, which depend on a certain parameter b. This so-called basin parameter b is such that for b=0 there are, a priori, no perturbed patterns to be recognized by the net. It is shown by a numerical study, via probing sets, that a net constructed to recognize perturbed patterns, i.e., with connections w_ij(b) for nonzero b, possesses larger basins of attraction than a net made with the help of the pure set of patterns alone, i.e., with connections w_ij(b=0). The mathematical results obtained can, in principle, be realized by an actual, biological neural net.
    Comment: 17 pages, LaTeX, 2 figures
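
    The construction above invites a quick numerical check. The following is a minimal numpy sketch of the probing-set experiment, not the paper's analytical expressions for w_ij(b): as a stand-in, perturbed patterns are constrained by least squares to map back onto their pure originals (playing the role of b != 0), and training on the pure set alone would give the b = 0 analogue. All sizes and noise levels are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 100, 5                          # neurons, pure patterns (illustrative)

    pure = rng.choice([-1, 1], size=(P, N))

    def flip(x, k):                        # perturb a pattern by flipping k bits
        y = x.copy()
        y[rng.choice(N, size=k, replace=False)] *= -1
        return y

    # Extended set of fixed-point-like equations: each perturbed input is
    # required to map onto its pure original (standing in for b != 0).
    X_in = np.vstack([pure] + [[flip(p, 3) for p in pure] for _ in range(4)])
    X_out = np.vstack([pure] * 5)
    W = X_out.T @ np.linalg.pinv(X_in.T)   # least-squares weights: W x_in ~ x_out

    def retrieve(x, steps=30):             # iterate the deterministic dynamics
        for _ in range(steps):
            x = np.sign(W @ x)
            x[x == 0] = 1
        return x

    # Probing set: flip ever more bits of a pure pattern and test retrieval.
    for k in range(0, 41, 10):
        hits = sum(np.array_equal(retrieve(flip(pure[0], k)), pure[0])
                   for _ in range(20))
        print(f"{k:2d} flips -> retrieved {hits}/20")
    ```

    Rerunning the probe with weights built from `pure` alone gives the b = 0 reference against which the enlarged basins can be compared.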

    A recurrent neural network with ever changing synapses

    A recurrent neural network with noisy input is studied analytically, on the basis of a Discrete Time Master Equation. The latter is derived from a biologically realizable learning rule for the weights of the connections. In a numerical study it is found that the fixed points of the dynamics of the net are time dependent, implying that the representation in the brain of a fixed piece of information (e.g., a word to be recognized) is not fixed in time.
    Comment: 17 pages, LaTeX, 4 figures
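
    The drifting-fixed-point effect is easy to illustrate in a toy setting. The sketch below does not implement the paper's master-equation treatment; it applies a generic Hebbian update with decay to noisy pattern presentations (noise level, learning rate and network size are assumed) and tracks how the attractor representing one stored pattern moves over time.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 100, 10
    xi = rng.choice([-1, 1], size=(P, N))           # stored pieces of information
    W = xi.T @ xi / N                               # initial Hebbian weights
    np.fill_diagonal(W, 0.0)

    eps, eta = 0.15, 0.05                           # input noise and learning rate

    def fixed_point(W, x0, steps=50):               # relax the noiseless dynamics
        x = x0.copy()
        for _ in range(steps):
            x = np.sign(W @ x)
            x[x == 0] = 1
        return x

    fp_prev = fixed_point(W, xi[0])
    for t in range(1, 1001):
        mu = rng.integers(P)                        # present a noisy random pattern
        noisy = xi[mu] * np.where(rng.random(N) < eps, -1, 1)
        W += eta * (np.outer(noisy, noisy) / N - W) # ever-changing synapses
        np.fill_diagonal(W, 0.0)
        if t % 200 == 0:                            # where is the attractor now?
            fp = fixed_point(W, xi[0])
            print(f"t={t}: overlap with previous snapshot = {fp @ fp_prev / N:.2f}")
            fp_prev = fp
    ```

    Because the synapses keep tracking the noisy input stream, the fixed point representing xi[0] is displaced from snapshot to snapshot, a toy version of the time-dependent representation described above.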

    Controlling chaos in diluted networks with continuous neurons

    Diluted neural networks with continuous neurons and a nonmonotonic transfer function are studied, with both fixed and dynamic synapses. A noisy stimulus with periodic variance results in a mechanism for controlling chaos in neural systems with fixed synapses: a proper amount of external perturbation forces the system to behave periodically, with the same period as the stimulus.
    Comment: 11 pages, 8 figures
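
    A stripped-down version of this control mechanism can be simulated directly. The sketch below assumes a particular nonmonotonic transfer function and illustrative parameters (neither taken from the paper) and compares how strongly the network activity repeats at the stimulus period with and without periodic modulation of the noise variance.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N, C, g = 300, 15, 4.0                 # neurons, connectivity, gain (assumed)

    # Diluted random couplings: each neuron receives exactly C inputs.
    J = np.zeros((N, N))
    for i in range(N):
        idx = rng.choice(np.delete(np.arange(N), i), size=C, replace=False)
        J[i, idx] = rng.choice([-1.0, 1.0], size=C) * g / np.sqrt(C)

    def f(h):                              # a nonmonotonic transfer (assumed form)
        return h * np.exp(-h * h)

    def activity(amp, sigma0=0.4, period=50, steps=2500):
        x = rng.normal(0, 0.5, N)
        q = []
        for t in range(steps):
            sigma = sigma0 * (1.0 + amp * np.sin(2 * np.pi * t / period))
            x = f(J @ x + sigma * rng.normal(size=N))   # noisy external stimulus
            q.append(np.mean(x * x))       # mean squared activity
        return np.array(q[500:])           # discard the transient

    # Report how strongly the activity repeats after one stimulus period.
    for amp in (0.0, 0.8):
        q = activity(amp)
        ac = np.corrcoef(q[:-50], q[50:])[0, 1]
        print(f"variance modulation {amp}: lag-50 autocorrelation = {ac:.2f}")
    ```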

    Through-membrane electron-beam lithography for ultrathin membrane applications

    We present a technique to fabricate ultrathin (down to 20 nm) uniform electron-transparent windows at dedicated locations in a SiN membrane for in situ transmission electron microscopy experiments. An electron-beam (e-beam) resist is spray-coated on the backside of the membrane, in a KOH-etched cavity in silicon, and is patterned using through-membrane electron-beam lithography. This is a controlled way to make transparent windows in membranes, whilst the topside of the membrane remains undamaged and retains its flatness. Our approach was optimized for MEMS-based heating chips but can be applied to any chip design. We show two different applications of this technique: (1) fabrication of a nanogap electrode by means of electromigration in thin free-standing metal films, and (2) making low-noise graphene nanopore devices.

    Derivation of Hebb's rule

    On the basis of the general form for the energy needed to adapt the connection strengths of a network in which learning takes place, a local learning rule is found for the changes of the weights. This biologically realizable learning rule turns out to comply with Hebb's neuro-physiological postulate, but is not of the form of any of the learning rules proposed in the literature. It is shown that, if a finite set of the same patterns is presented over and over again to the network, the weights of the synapses converge to finite values. Furthermore, it is proved that the final values found in this biologically realizable limit are the same as those found via a mathematical approach to the problem of finding the weights of a partially connected neural network that can store a collection of patterns. The mathematical solution is obtained via a modified version of the so-called method of the pseudo-inverse, and has the inverse of a reduced correlation matrix, rather than the usual correlation matrix, as its basic ingredient. Thus, a biological network might realize the final results of the mathematician by the energetically economic rule for the adaptation of the synapses found in this article.
    Comment: 29 pages, LaTeX, 3 figures
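
    The convergence claim, namely that presenting the same finite pattern set over and over drives a local rule to the pseudo-inverse weights, can be checked with a simple stand-in. The sketch below uses an LMS-style local update rather than the paper's energy-based rule, and the ordinary rather than the reduced correlation matrix (full connectivity is assumed); it only illustrates the generic effect.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N, P = 60, 8
    X = rng.choice([-1, 1], size=(P, N)).astype(float)

    # Closed-form pseudo-inverse weights: the smallest-norm W with W xi = xi.
    W_pinv = X.T @ np.linalg.pinv(X.T)

    # Local incremental learning: the same patterns presented over and over,
    # each weight changed using pre- and postsynaptic activity only.
    W = np.zeros((N, N))
    eta = 0.01
    for epoch in range(2000):
        for xi in X:
            W += eta * np.outer(xi - W @ xi, xi) / N

    print("max |W_iterative - W_pseudoinverse| =", np.abs(W - W_pinv).max())
    ```

    Starting from zero weights, the incremental rule converges to the minimum-norm pseudo-inverse solution, mirroring the equivalence proved in the paper (there with a reduced correlation matrix to handle partial connectivity).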

    Damage spreading in the mode-coupling equations for glasses

    We examine the problem of damage spreading in the off-equilibrium mode-coupling equations. The study is done for the spherical p-spin model introduced by Crisanti, Horner and Sommers. For p>2 we show the existence of a temperature transition T_0 well above any relevant thermodynamic transition temperature. Above T_0 the asymptotic damage decays to zero, while below T_0 it decays to a finite value independent of the initial damage. This transition is stable in the presence of asymmetry in the interactions. We discuss the physical origin of this peculiar phase transition, which occurs as a consequence of the non-linear coupling between the damage and the two-time correlation functions.
    Comment: 5 pages, 2 figures, RevTeX file

    Dynamical TAP approach to mean field glassy systems

    The Thouless-Anderson-Palmer (TAP) approach to the thermodynamics of mean-field spin glasses is generalised to dynamics. A method to compute the dynamical TAP equations is developed and applied to the p-spin spherical model. In this context we show to what extent the dynamics can be represented as an evolution in the free-energy landscape. In particular, the relationship between the long-time dynamics and the local properties of the free-energy landscape shows up explicitly within this approach. Conversely, by an instantaneous normal-mode analysis we show that the local properties of the energy landscape seen by the system during its dynamical evolution do not change qualitatively at the dynamical transition.
    Comment: final version, 21 pages, 1 eps figure

    Hierarchical Self-Programming in Recurrent Neural Networks

    We study self-programming in recurrent neural networks, where both neurons (the `processors') and synaptic interactions (`the programme') evolve in time simultaneously, according to specific coupled stochastic equations. The interactions are divided into a hierarchy of L groups with adiabatically separated and monotonically increasing time-scales, representing sub-routines of the system programme of decreasing volatility. We solve this model in equilibrium, assuming ergodicity at every level, and find as our replica-symmetric solution a formalism with a structure similar but not identical to Parisi's L-step replica symmetry breaking scheme. Apart from differences in details of the equations (due to the fact that here interactions, rather than spins, are grouped into clusters with different time-scales), in the present model the block sizes m_i of the emerging ultrametric solution are not restricted to the interval [0,1], but are independent control parameters, defined in terms of the noise strengths of the various levels in the hierarchy, which can take any value in [0,\infty). This is shown to lead to extremely rich phase diagrams, with an abundance of first-order transitions, especially when the level of stochasticity in the interaction dynamics is chosen to be low.
    Comment: 53 pages, 19 figures. Submitted to J. Phys.
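
    The model class is concrete enough to simulate in stripped-down form. The sketch below runs only two levels instead of a hierarchy of L: fast Glauber spins (the `processors') coupled to slowly drifting symmetric couplings (the `programme'), with separate noise strengths and an adiabatic time-scale ratio. The update rules and all parameters are assumptions for illustration, not the paper's equations.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    N = 80
    s = rng.choice([-1.0, 1.0], size=N)             # fast level: the 'processors'
    J = rng.normal(0, 1 / np.sqrt(N), (N, N))
    J = (J + J.T) / 2                               # slow level: the 'programme'
    np.fill_diagonal(J, 0.0)

    T_fast, T_slow = 0.4, 0.05    # independent noise strengths for the two levels
    tau = 500                     # adiabatic separation of the time-scales

    for t in range(1, 100001):
        i = rng.integers(N)                         # Glauber update of one spin
        h = J[i] @ s
        s[i] = 1.0 if rng.random() < 1 / (1 + np.exp(-2 * h / T_fast)) else -1.0
        if t % tau == 0:                            # rare synaptic move:
            i, j = rng.integers(N, size=2)          # Hebbian drift plus noise
            if i != j:
                dJ = 0.1 * (s[i] * s[j] / np.sqrt(N) - J[i, j]) \
                     + np.sqrt(T_slow / N) * rng.normal()
                J[i, j] += dJ
                J[j, i] = J[i, j]
        if t % 25000 == 0:                          # monitor the energy per spin
            print(f"t={t}: energy per spin = {-0.5 * s @ J @ s / N:.3f}")
    ```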

    Damage spreading transition in glasses: a probe for the ruggedness of the configurational landscape

    We consider damage spreading transitions in the framework of mode-coupling theory. This theory describes relaxation processes in glasses in the mean-field approximation, which are known to be characterized by the presence of an exponentially large number of meta-stable states. For systems evolving under identical but arbitrarily correlated noises, we demonstrate that there exists a critical temperature T_0 which separates two different dynamical regimes depending on whether damage spreads or not in the asymptotic long-time limit. This transition exists for generic noise correlations such that the zero-damage solution is stable at high temperatures, being minimal for maximal noise correlations. Although this dynamical transition depends on the type of noise correlations, we show that the asymptotic damage has the properties of a good dynamical order parameter, namely: 1) independence of the initial damage; 2) independence of the class of initial conditions; and 3) stability of the transition in the presence of asymmetric interactions which violate detailed balance. For maximally correlated noises we suggest that damage spreading occurs due to the presence of a divergent number of saddle points (as well as meta-stable states) in the thermodynamic limit, a consequence of the ruggedness of the free-energy landscape which characterizes the glassy state. These results are then compared to extensive numerical simulations of a mean-field glass model (the Bernasconi model) with Monte Carlo heat-bath dynamics. The freedom of choosing arbitrary noise correlations for Langevin dynamics makes damage spreading an interesting tool to probe the ruggedness of the configurational landscape.
    Comment: 25 pages, 13 postscript figures. Paper extended to include cross-correlations
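
    The simulation protocol is simple to reproduce in miniature: evolve two copies of the same system with identical random numbers (maximally correlated noises) and watch whether an initial difference heals or persists. The sketch below uses heat-bath dynamics on a mean-field spin glass with Gaussian couplings as a stand-in for the Bernasconi model studied in the paper; sizes, sweep counts and temperatures are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    N = 200
    J = rng.normal(0, 1 / np.sqrt(N), (N, N))
    J = (J + J.T) / 2                      # symmetric mean-field couplings
    np.fill_diagonal(J, 0.0)

    def heat_bath_sweep(s, T, noise):
        # One sequential heat-bath sweep; `noise` holds the uniform random
        # numbers, so two replicas fed the same array see identical noise.
        for i in range(len(s)):
            h = J[i] @ s
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))
            s[i] = 1 if noise[i] < p_up else -1

    for T in (0.2, 0.5, 1.0, 2.0):
        a = rng.choice([-1, 1], size=N)
        b = a.copy()
        b[: N // 2] *= -1                  # initial damage: half the spins flipped
        for _ in range(200):
            u = rng.random(N)              # identical noise for both copies
            heat_bath_sweep(a, T, u)
            heat_bath_sweep(b, T, u)
        print(f"T={T}: damage after 200 sweeps = {np.mean(a != b):.3f}")
    ```

    Damage that decays towards zero at high T but saturates at a finite value at low T is the signature of the T_0 transition described above.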

    Diluted neural networks with adapting and correlated synapses

    We consider the dynamics of diluted neural networks with clipped and adapting synapses. Unlike in previous studies, the learning rate is kept constant as the connectivity tends to infinity: the synapses evolve on a time scale intermediate between the quenched and annealed limits, and all orders of synaptic correlations must be taken into account. The dynamics is solved by mean-field theory, the order parameter for the synapses being a function. We describe the effects of synaptic correlations on the double dynamics.
    Comment: 6 pages, 3 figures. Accepted for publication in PR